Byzantine-Resilient Decentralized Stochastic Gradient Descent

Authors

Abstract

Decentralized learning has gained great popularity to improve learning efficiency and preserve data privacy. Each computing node makes an equal contribution to collaboratively learn a Deep Learning model. The elimination of centralized Parameter Servers (PS) can effectively address many issues such as privacy, performance bottlenecks, and single points of failure. However, how to achieve Byzantine Fault Tolerance in decentralized learning systems is rarely explored, although this problem has been extensively studied in centralized systems. In this paper, we present an in-depth study towards the Byzantine resilience of decentralized learning systems with two contributions. First, from the adversarial perspective, we theoretically illustrate that Byzantine attacks are more dangerous and feasible in decentralized systems: even one malicious participant can arbitrarily alter the models of other participants by sending carefully crafted updates to its neighbors. Second, from the defense perspective, we propose Ubar, a novel algorithm to enhance decentralized learning with Byzantine Fault Tolerance. Specifically, Ubar provides a Uniform Byzantine-resilient Aggregation Rule for benign nodes to select useful parameter updates and filter out malicious ones in each training iteration. It guarantees that the system can train a correct model under very strong Byzantine attacks with an arbitrary number of faulty nodes. We conduct extensive experiments on standard image classification tasks, and the results indicate that Ubar can defeat both simple and sophisticated Byzantine attacks with higher performance than existing solutions.
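To make the aggregation rule concrete, below is a minimal sketch of what a Ubar-style update at one benign node could look like, assuming a two-step filter: a distance-based pre-selection of neighbor models followed by a loss-based check on a local data batch. The function name and the parameters rho and alpha are illustrative assumptions, not the paper's exact formulation.

    import numpy as np

    def ubar_style_aggregate(own_params, local_loss, neighbor_params,
                             rho=0.4, alpha=0.5):
        """Sketch of a Byzantine-resilient aggregation step at one benign node.

        own_params      -- this node's current parameter vector (np.ndarray)
        local_loss      -- callable evaluating a parameter vector on a local batch
        neighbor_params -- list of parameter vectors received from neighbors
        rho, alpha      -- illustrative hyperparameters (selection ratio, self weight)
        """
        # Step 1: keep only the rho-fraction of neighbor models closest to our
        # own; updates crafted to drag the model far away are dropped here.
        dists = [np.linalg.norm(p - own_params) for p in neighbor_params]
        k = max(1, int(rho * len(neighbor_params)))
        candidates = [neighbor_params[i] for i in np.argsort(dists)[:k]]

        # Step 2: keep only candidates that perform at least as well as our own
        # model on a local data batch; fall back to the nearest one if none do.
        baseline = local_loss(own_params)
        useful = [p for p in candidates if local_loss(p) <= baseline] or [candidates[0]]

        # Blend our own parameters with the average of the surviving updates.
        return alpha * own_params + (1 - alpha) * np.mean(useful, axis=0)

The property this sketch tries to capture is the one stated in the abstract: a benign node does not have to trust any fixed neighbor, since it re-selects useful updates at every training iteration.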


Similar articles

Byzantine Stochastic Gradient Descent

This paper studies the problem of distributed stochastic optimization in an adversarial setting where, out of the m machines which allegedly compute stochastic gradients every iteration, an α-fraction are Byzantine, and can behave arbitrarily and adversarially. Our main result is a variant of stochastic gradient descent (SGD) which finds ε-approximate minimizers of convex functions in T = Õ(1...
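For intuition about this setting (not this paper's algorithm, which is more involved), a single robust SGD step can replace plain gradient averaging with a coordinate-wise median over the m reported gradients, which tolerates an α < 1/2 fraction of arbitrary reports:

    import numpy as np

    def robust_sgd_step(w, reported_grads, lr=0.1):
        """One SGD step with coordinate-wise-median aggregation.

        reported_grads -- one stochastic gradient per machine; an alpha-fraction
        of them may be arbitrary (Byzantine). The median is a generic robust
        aggregator shown for illustration, not the estimator from this paper.
        """
        g = np.median(np.stack(reported_grads), axis=0)
        return w - lr * g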

Full text

ByRDiE: Byzantine-resilient distributed coordinate descent for decentralized learning

Distributed machine learning algorithms enable processing of datasets that are distributed over a network without gathering the data at a centralized location. While efficient distributed algorithms have been developed under the assumption of faultless networks, failures that can render these algorithms nonfunctional indeed happen in the real world. This paper focuses on the problem of Byzantin...

Full text

Asynchronous Decentralized Parallel Stochastic Gradient Descent

Recent work shows that decentralized parallel stochastic gradient descent (D-PSGD) can outperform its centralized counterpart both theoretically and practically. While asynchronous parallelism is a powerful technology to improve the efficiency of parallelism in distributed machine learning platforms and has been widely used in many popular machine learning software packages and solvers based on centrali...
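As background, here is a minimal sketch of the synchronous D-PSGD round that asynchronous variants relax, assuming a doubly stochastic mixing matrix W over the node graph (a common textbook form, not necessarily this paper's exact scheme):

    import numpy as np

    def dpsgd_round(params, grads, W, lr=0.05):
        """One synchronous D-PSGD round over all nodes at once.

        params -- shape (n_nodes, dim); row i is node i's model copy
        grads  -- stochastic gradients of the local losses, same shape
        W      -- doubly stochastic mixing matrix (W[i, j] > 0 iff nodes i and j
                  are neighbors); gossip averaging replaces the parameter server
        """
        # Each node averages its neighbors' models, then takes a local step.
        return W @ params - lr * grads

An asynchronous variant lets each node run this update with whatever (possibly stale) neighbor models it currently holds, instead of waiting for the round to complete.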

Full text

Robust Decentralized Differentially Private Stochastic Gradient Descent

Stochastic gradient descent (SGD) is one of the most widely applied machine learning algorithms in unreliable, large-scale decentralized environments. In this type of environment, data privacy is a fundamental concern. The most popular way to investigate this topic is based on the framework of differential privacy. However, many important implementation details and the performance of differentially priv...

Full text

On Nonconvex Decentralized Gradient Descent

Consensus optimization has received considerable attention in recent years. A number of decentralized algorithms have been proposed for convex consensus optimization. However, for consensus optimization with nonconvex objective functions, our understanding of the behavior of these algorithms is limited. When we lose convexity, we cannot hope for obtaining globally optimal solutions (though we st...
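For reference, the decentralized gradient descent iteration analyzed in this line of work is commonly written as (a standard form, not necessarily this paper's exact scheme):

    x_i^{k+1} = \sum_{j=1}^{n} w_{ij} x_j^{k} - \alpha \nabla f_i(x_i^{k}),

where W = (w_{ij}) is a symmetric doubly stochastic mixing matrix and \alpha is a step size. With nonconvex local objectives f_i, the guarantees one can hope for are convergence to (approximate) stationary points of the consensus problem rather than global minima.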

Full text


Journal

Journal title: IEEE Transactions on Circuits and Systems for Video Technology

Year: 2022

ISSN: 1051-8215, 1558-2205

DOI: https://doi.org/10.1109/tcsvt.2021.3116976